LMS Algorithm Step Size Adjustment for Fast Convergence

Author

  • Dariusz BISMOR
Abstract

In areas of acoustic research, or in applications that deal with not-precisely-known or variable conditions, a method of adaptation to the uncertainty or to the changes is usually necessary. When searching for an adaptation algorithm, it is hard to overlook the least mean squares (LMS) algorithm. Its simplicity, speed of computation, and robustness have won it a wide range of applications: from telecommunications, through acoustics and vibration, to seismology. The algorithm, however, still lacks a full theoretical analysis. This is probably the cause of its main drawback: the need for a careful choice of the step size, which is why so many variable step size variants of the LMS algorithm have been developed. This paper contributes to both of the above-mentioned characteristics of the LMS algorithm. First, it presents a derivation of a new necessary condition for LMS algorithm convergence. The condition, although weak, proved useful in developing a new variable step size LMS algorithm, which turned out to be quite different from the algorithms known from the literature. Moreover, the algorithm proved effective in both simulations and laboratory experiments, covering two possible applications: adaptive line enhancement and active noise control.
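As a point of reference, the sketch below shows where a step-size adjustment enters the standard LMS recursion. It is a minimal illustration in Python, assuming a hypothetical error-driven step-size rule; it does not reproduce the convergence condition or the variable step size algorithm derived in this paper.

```python
# Minimal LMS adaptive filter with a placeholder variable step-size rule.
# The rule below (error-energy smoothing) is a hypothetical example, not
# the algorithm derived in the paper.
import numpy as np

def lms_variable_step(x, d, num_taps=4, mu0=0.05, mu_min=1e-4, mu_max=0.2):
    """Adapt an FIR filter so that its output tracks the desired signal d."""
    w = np.zeros(num_taps)              # filter weights
    mu = mu0                            # current step size
    err = np.zeros(len(x))
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1:n + 1][::-1]   # regressor: x[n], x[n-1], ...
        e = d[n] - w @ u                       # estimation error
        # Hypothetical step-size rule: larger errors push mu up,
        # small errors let it decay (assumption, for illustration only).
        mu = np.clip(0.97 * mu + 0.03 * e * e, mu_min, mu_max)
        w = w + mu * e * u                     # standard LMS weight update
        err[n] = e
    return w, err

# Usage: identify a short unknown FIR system from noisy observations.
rng = np.random.default_rng(0)
h = np.array([0.6, -0.3, 0.1, 0.05])                 # unknown system
x = rng.standard_normal(4000)                        # excitation
d = np.convolve(x, h)[:len(x)] + 0.01 * rng.standard_normal(len(x))
w, e = lms_variable_step(x, d)
print(np.round(w, 3))                                # approaches h
```

In the sketch, the only departure from fixed-step LMS is the single line that updates `mu` before the weight update; any variable step size scheme, including the one proposed in the paper, would replace that line with its own rule.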


Similar articles

The Wavelet Transform-Domain LMS Adaptive Filter Algorithm with Variable Step-Size

The wavelet transform-domain least-mean square (WTDLMS) algorithm uses the self-orthogonalizing technique to improve the convergence performance of LMS. In the WTDLMS algorithm, the trade-off between the steady-state error and the convergence rate is governed by a fixed step size. In this paper, the WTDLMS adaptive algorithm with variable step-size (VSS) is established. The step-size in each subf...
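For orientation, the following is a generic sketch of a transform-domain (self-orthogonalizing) LMS filter of the kind this snippet describes. It uses a DCT as a stand-in orthonormal transform and a fixed, power-normalized step size per coefficient; the wavelet transform and the variable step-size rule of the cited paper are not reproduced, and all parameter values are illustrative assumptions.

```python
# Generic transform-domain (self-orthogonalizing) LMS sketch.
# A fixed orthonormal transform decorrelates the tap-input vector, and each
# transformed coefficient is updated with a step size normalized by a running
# estimate of that coefficient's power.
import numpy as np

def dct_matrix(m):
    """Orthonormal DCT-II matrix of size m x m (stand-in for a wavelet transform)."""
    n = np.arange(m)
    c = np.sqrt(2.0 / m) * np.cos(np.pi * (n[None, :] + 0.5) * n[:, None] / m)
    c[0, :] /= np.sqrt(2.0)
    return c

def transform_domain_lms(x, d, num_taps=8, mu=0.5, beta=0.99, eps=1e-6):
    t = dct_matrix(num_taps)                     # fixed orthonormal transform
    w = np.zeros(num_taps)                       # weights in the transform domain
    p = np.full(num_taps, np.var(x) + eps)       # per-coefficient power, initialized safely
    err = np.zeros(len(x))
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1:n + 1][::-1]
        v = t @ u                                # transformed regressor
        err[n] = d[n] - w @ v
        p = beta * p + (1.0 - beta) * v * v      # running power estimate per coefficient
        w = w + mu * err[n] * v / (p + eps)      # power-normalized LMS update
    return w, err
```

The power normalization is what equalizes the convergence rate across the transform bands; a variable step-size scheme would additionally make `mu` time-varying.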


Review and Comparison of Variable Step-Size LMS Algorithms

The inherent feature of the Least Mean Squares (LMS) algorithm is the step size, which requires careful adjustment. A small step size, required for a small excess mean square error, results in slow convergence. A large step size, needed for fast adaptation, may result in loss of stability. Therefore, many modifications of the LMS algorithm, where the step size changes during the adaptation process d...
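For context, the classical bounds behind this step-size trade-off are the textbook conditions below; they are stated here for reference only and are not the new necessary condition derived in the paper above.

```latex
% Classical step-size bounds for the LMS update w(n+1) = w(n) + mu * e(n) * u(n)
\[
  0 < \mu < \frac{2}{\lambda_{\max}}
  \qquad \text{(convergence of the weights in the mean)}
\]
\[
  0 < \mu < \frac{2}{\operatorname{tr}(\mathbf{R})}
          = \frac{2}{M\,E\{u^{2}(n)\}}
  \qquad \text{(common conservative mean-square bound)}
\]
% Here R = E{u(n) u(n)^T} is the input autocorrelation matrix, lambda_max its
% largest eigenvalue, and M the number of filter taps.
```

Values of the step size well inside these bounds give a low excess mean square error but slow convergence; values close to the upper bound adapt quickly but risk instability, which is exactly the trade-off the variable step-size algorithms try to resolve.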


Analysis the results of Acoustic Echo Cancellation for speech processing using LMS Adaptive Filtering Algorithm

The conventional acoustic echo canceller encounters problems such as a slow convergence rate (especially for speech signals) and high computational complexity, as identification of the echo path requires a filter with more than a thousand taps, a non-stationary speech input, and slowly time-varying systems to be identified. The demand for fast convergence and a low MSE level cannot be met by conventional a...


Non Stationary Noise Removal from Speech Signals using Variable Step Size Strategy

The aim of this paper is to implement various adaptive noise cancellers (ANC) for speech enhancement based on the gradient descent approach, namely the least-mean square (LMS) algorithm, which is then enhanced with a variable step size strategy. In practical applications of the LMS algorithm, a key parameter is the step size. As is well known, if the step size is large, the convergence rate of the LMS algorit...
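For reference, a minimal two-input adaptive noise canceller in the configuration this snippet describes is sketched below, assuming a fixed step size; the variable step size strategy of the cited paper is not reproduced, and all signals and parameter values are illustrative assumptions.

```python
# Minimal two-input adaptive noise canceller (ANC) using plain LMS.
# primary = signal + noise; reference = noise correlated with the primary noise;
# the LMS error output is the enhanced (de-noised) signal.
import numpy as np

def anc_lms(primary, reference, num_taps=16, mu=0.01):
    w = np.zeros(num_taps)
    enhanced = np.zeros(len(primary))
    for n in range(num_taps - 1, len(primary)):
        u = reference[n - num_taps + 1:n + 1][::-1]
        noise_est = w @ u                 # estimate of the noise reaching the primary input
        e = primary[n] - noise_est        # error = enhanced signal sample
        w = w + mu * e * u                # fixed-step LMS update
        enhanced[n] = e
    return enhanced

# Usage: a sinusoidal "speech" signal corrupted by filtered white noise.
rng = np.random.default_rng(1)
t = np.arange(16000) / 8000.0
speech = np.sin(2 * np.pi * 440 * t)
ref_noise = rng.standard_normal(len(t))
noise_in_primary = np.convolve(ref_noise, [0.7, 0.2, -0.1])[:len(t)]
enhanced = anc_lms(speech + noise_in_primary, ref_noise)
```

A variable step size version would replace the constant `mu` with a rule that enlarges the step while the error is dominated by uncancelled noise and shrinks it once the canceller has converged.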


A Simple and Fast Converging Algorithm for MMSE Adaptive Array Antenna

This paper proposes a fast and simple adaptive algorithm for MMSE (Minimum Mean Square Error) adaptive array antennas or MMSE combining diversity. This algorithm can be implemented with as a sm...



Journal:

Volume   Issue

Pages  -

Publication date: 2012